
    Rising Wage Inequality: The Role of Composition and Prices

    During the early 1980s, earnings inequality in the U.S. labor market rose relatively uniformly throughout the wage distribution. But this uniformity gave way to a significant divergence starting in 1987, with upper-tail (90/50) inequality rising steadily and lower-tail (50/10) inequality either flattening or compressing for the next 16 years (1987 to 2003). This paper applies and extends a quantile decomposition technique proposed by Machado and Mata (2005) to evaluate the contributions of changing labor force composition (in terms of education and experience) and changing labor market prices to the expansion and subsequent divergence of upper- and lower-tail inequality over the last three decades. We show that the extended Machado-Mata quantile decomposition corrects shortcomings of the original Juhn-Murphy-Pierce (1993) full distribution accounting method and nests the kernel reweighting approach proposed by DiNardo, Fortin and Lemieux (1996). Our analysis reveals that shifts in labor force composition raised earnings inequality during the 1990s. But these compositional shifts have primarily operated on the lower half of the earnings distribution by muting a contemporaneous, countervailing lower-tail price compression. The steady rise of upper-tail inequality since the late 1970s appears almost entirely explained by ongoing between-group price changes (particularly increasing wage differentials by education) and residual price changes.
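The upper-tail (90/50) and lower-tail (50/10) measures used throughout these abstracts are simply gaps between log wage percentiles. A minimal sketch in Python, using simulated log wages rather than the CPS microdata the papers analyze:

```python
import numpy as np

# Hypothetical sample of log hourly wages; real analyses use CPS microdata.
rng = np.random.default_rng(0)
log_wages = rng.normal(loc=2.9, scale=0.6, size=10_000)

p10, p50, p90 = np.percentile(log_wages, [10, 50, 90])

# Upper- and lower-tail inequality as log wage gaps.
upper_tail = p90 - p50   # the "90/50" measure
lower_tail = p50 - p10   # the "50/10" measure
print(f"90/50 gap: {upper_tail:.3f}, 50/10 gap: {lower_tail:.3f}")
```

Because the simulated data are symmetric, the two gaps come out roughly equal here; the divergence the papers document is precisely these two series moving apart after 1987.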

    The Polarization of the U.S. Labor Market

    This paper analyzes a marked change in the evolution of the U.S. wage structure over the past fifteen years: divergent trends in upper-tail (90/50) and lower-tail (50/10) wage inequality. We document that wage inequality in the top half of the distribution has displayed an unchecked and rather smooth secular rise for the last 25 years (since 1980). Wage inequality in the bottom half of the distribution also grew rapidly from 1979 to 1987, but it has ceased growing (and for some measures actually narrowed) since the late 1980s. Furthermore, we find that occupational employment growth shifted from monotonically increasing in wages (education) in the 1980s to a pattern of more rapid growth in jobs at the top and bottom relative to the middle of the wage (education) distribution in the 1990s. We characterize these patterns as the "polarization" of the U.S. labor market, with employment polarizing into high-wage and low-wage jobs at the expense of middle-wage work. We show how a model of computerization -- in which computers most strongly complement the non-routine (abstract) cognitive tasks of high-wage jobs, directly substitute for the routine tasks found in many traditional middle-wage jobs, and may have little direct impact on non-routine manual tasks in relatively low-wage jobs -- can help explain the observed polarization of the U.S. labor market.

    Trends in U.S. Wage Inequality: Re-Assessing the Revisionists

    A large literature documents a substantial rise in U.S. wage inequality and educational wage differentials over the past several decades and finds that these trends can be primarily accounted for by shifts in the supply of and demand for skills, reinforced by the erosion of labor market institutions affecting the wages of low- and middle-wage workers. Drawing on an additional decade of data, a number of recent contributions reject this consensus to conclude that (1) the rise in wage inequality was an "episodic" event of the first half of the 1980s rather than a "secular" phenomenon, (2) this rise was largely caused by a falling minimum wage rather than by supply and demand factors, and (3) rising residual wage inequality since the mid-1980s is explained by confounding effects of labor force composition rather than true increases in inequality within detailed demographic groups. We reexamine these claims using detailed data from the Current Population Survey and find only limited support for them. Although the growth of overall inequality in the U.S. slowed in the 1990s, upper-tail inequality rose almost as rapidly during the 1990s as during the 1980s. A decomposition applied to the CPS data reveals a large and persistent rise in within-group earnings inequality over the past several decades, controlling for changes in labor force composition. While changes in the minimum wage can potentially account for much of the movement in lower-tail earnings inequality, strong time-series correlations between the evolution of the real minimum wage and upper-tail wage inequality raise questions concerning the causal interpretation of such relationships.
We also find that changes in the college/high-school wage premium appear to be well captured by standard models emphasizing rapid secular growth in the relative demand for skills and fluctuations in the rate of growth of the relative supply of college workers, though these models do not accurately predict the slowdown in the growth of the college/high-school gap during the 1990s. We conclude that these patterns are not adequately explained by either a "unicausal" skill-biased technical change explanation or a revisionist hypothesis focused primarily on minimum wages and mechanical labor force compositional effects. We speculate that these puzzles can be partially reconciled by a modified version of the skill-biased technical change hypothesis that generates a polarization of skill demands.

    Trends in U.S. Wage Inequality: Re-Assessing the Revisionists

    A recent "revisionist" literature characterizes the pronounced rise in U.S. wage inequality since 1980 as an "episodic" event of the first half of the 1980s driven by non-market factors (particularly a falling real minimum wage) and concludes that continued increases in wage inequality since the late 1980s substantially reflect the mechanical confounding effects of changes in labor force composition. Analyzing data from the Current Population Survey for 1963 to 2005, we find limited support for these claims. The slowing of the growth of overall wage inequality in the 1990s hides a divergence in the paths of upper-tail (90/50) inequality -- which has increased steadily since 1980, even adjusting for changes in labor force composition -- and lower-tail (50/10) inequality, which rose sharply in the first half of the 1980s and plateaued or contracted thereafter. Fluctuations in the real minimum wage are not a plausible explanation for these trends, since the bulk of inequality growth occurs above the median of the wage distribution. Models emphasizing rapid secular growth in the relative demand for skills -- attributable to skill-biased technical change -- and a sharp deceleration in the relative supply of college workers in the 1980s do an excellent job of capturing the evolution of the college/high-school wage premium over four decades. But these models also imply a puzzling deceleration in relative demand growth for college workers in the early 1990s, also visible in a recent "polarization" of skill demands in which employment has expanded in high-wage and low-wage work at the expense of middle-wage jobs. These patterns are potentially reconciled by a modified version of the skill-biased technical change hypothesis that emphasizes the role of information technology in complementing abstract (high-education) tasks and substituting for routine (middle-education) tasks.
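The supply-demand framework these abstracts invoke is commonly written as a two-factor CES model in which the log college/high-school premium rises with a relative demand shifter and falls with relative supply. A minimal illustrative sketch (the function name, parameter values, and elasticity are assumptions for illustration, not estimates from the papers):

```python
# A minimal sketch of the canonical two-factor CES supply-demand model
# of the college wage premium; all numbers are illustrative, not estimates.
def log_college_premium(log_rel_supply: float, demand_index: float,
                        sigma: float = 1.6) -> float:
    """Log college/high-school wage premium implied by the CES model.

    sigma: elasticity of substitution between college and high-school labor.
    demand_index: log relative demand shifter (e.g., a linear time trend).
    log_rel_supply: log ratio of college to high-school labor supply.
    """
    return (demand_index - log_rel_supply) / sigma

# Holding demand growth fixed, a deceleration in relative supply
# (a smaller log_rel_supply) implies a higher premium.
premium_fast_supply = log_college_premium(log_rel_supply=0.50, demand_index=1.0)
premium_slow_supply = log_college_premium(log_rel_supply=0.30, demand_index=1.0)
```

This is the mechanism behind the abstract's claim that a 1980s supply deceleration, against steady demand growth, can capture the rising premium.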

    Data, Data Everywhere, and Still Too Hard to Link: Insights from User Interactions with Diabetes Apps

    For those with chronic conditions, such as Type 1 diabetes, smartphone apps offer the promise of an affordable, convenient, and personalized disease management tool. However, despite significant academic research and commercial development in this area, diabetes apps still show low adoption rates and underwhelming clinical outcomes. Through user-interaction sessions with 16 people with Type 1 diabetes, we provide evidence that commonly used interfaces for diabetes self-management apps, while providing certain benefits, can fail to explicitly address the cognitive and emotional requirements of users. From analysis of these sessions with eight such user interface designs, we report on user requirements, as well as interface benefits and limitations, and then discuss the implications of these findings. Finally, with the goal of improving these apps, we identify three questions for designers and review for each in turn: current shortcomings, relevant approaches, exposed challenges, and potential solutions.

    Cauchy, infinitesimals and ghosts of departed quantifiers

    Procedures relying on infinitesimals in Leibniz, Euler and Cauchy have been interpreted in both the Weierstrassian framework and Robinson's framework. The latter provides closer proxies for the procedures of the classical masters. Thus, Leibniz's distinction between assignable and inassignable numbers finds a proxy in the distinction between standard and nonstandard numbers in Robinson's framework, while Leibniz's law of homogeneity, with the implied notion of equality up to negligible terms, finds a mathematical formalisation in terms of the standard part. It is hard to provide parallel formalisations in a Weierstrassian framework, but scholars since Ishiguro have engaged in a quest for ghosts of departed quantifiers to provide a Weierstrassian account for Leibniz's infinitesimals. Euler similarly had notions of equality up to negligible terms, of which he distinguished two types: geometric and arithmetic. Euler routinely used product decompositions into a specific infinite number of factors, and used the binomial formula with an infinite exponent. Such procedures have immediate hyperfinite analogues in Robinson's framework, while in a Weierstrassian framework they can only be reinterpreted by means of paraphrases departing significantly from Euler's own presentation. Cauchy gives lucid definitions of continuity in terms of infinitesimals that find ready formalisations in Robinson's framework, but scholars working in a Weierstrassian framework bend over backwards either to claim that Cauchy was vague or to engage in a quest for ghosts of departed quantifiers in his work. Cauchy's procedures in the context of his 1853 sum theorem (for series of continuous functions) are more readily understood from the viewpoint of Robinson's framework, where one can exploit tools such as the pointwise definition of the concept of uniform convergence. Keywords: historiography; infinitesimal; Latin model; butterfly model
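The standard-part formalisation mentioned above can be stated compactly. A sketch in standard nonstandard-analysis notation (this is the usual Robinson-framework rendering, not a quotation from the paper):

```latex
% Equality up to negligible terms: two finite hyperreals are
% "equal up to an infinitesimal" when their difference has standard part zero.
x \approx y \;\Longleftrightarrow\; \operatorname{st}(x - y) = 0

% Cauchy-style continuity of f at a real point x via infinitesimals:
% an infinitesimal change of input produces an infinitesimal change of output.
f \text{ is continuous at } x
  \;\Longleftrightarrow\;
  \forall \varepsilon \approx 0,\; f(x + \varepsilon) \approx f(x)
```

Note that the right-hand condition involves no alternating real quantifiers, which is the contrast the "ghosts of departed quantifiers" phrase is drawing with the Weierstrassian epsilon-delta definition.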

    Voigt-Profile Analysis of the Lyman-alpha Forest in a Cold Dark Matter Universe

    We use an automated Voigt-profile fitting procedure to extract statistical properties of the Lyα forest in a numerical simulation of an Ω = 1, cold dark matter (CDM) universe. Our analysis method is similar to that used in most observational studies of the forest, and we compare the simulations to recently published results derived from Keck HIRES spectra. With the Voigt-profile decomposition analysis, the simulation reproduces the large number of weak lines (N_HI ≲ 10^13 cm^-2) found in the HIRES spectra. The column density distribution evolves significantly between z = 3 and z = 2, with the number of lines at fixed column density dropping by a factor of ~1.6 in the range where line blending is not severe. At z = 3, the b-parameter distribution has a median of 35 km/s and a dispersion of 20 km/s, in reasonable agreement with the observed values. The comparison between our new analysis and recent data strengthens earlier claims that the Lyα forest arises naturally in hierarchical structure formation as photoionized gas falls into dark matter potential wells. However, there are two statistically significant discrepancies between the simulated forest and the HIRES results: the model produces too many lines at z = 3 by a factor of ~1.5-2, and it produces more narrow lines (b < 20 km/s) than are seen in the data. The first result is sensitive to our adopted normalization of the mean Lyα optical depth, and the second is sensitive to our assumption that helium reionization has not significantly raised gas temperatures at z = 3. It is therefore too early to say whether these discrepancies indicate a fundamental problem with the high-redshift structure of the Ω = 1 CDM model or reflect errors of detail in our modeling of the gas distribution or the observational procedure.
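The Voigt profile fitted in such analyses is the convolution of a Gaussian (thermal/Doppler broadening, parameterized by the b-parameter) with a Lorentzian (natural/damping broadening), and it is conveniently evaluated via the Faddeeva function. A minimal sketch (function name and parameter choices are illustrative, not the paper's actual fitting code):

```python
import numpy as np
from scipy.special import wofz  # Faddeeva function w(z) = exp(-z^2) erfc(-iz)

def voigt_profile(v, b, gamma):
    """Unit-normalized Voigt profile in velocity space.

    v: velocity offset from line center (km/s)
    b: Doppler b-parameter (km/s); the Gaussian sigma is b / sqrt(2)
    gamma: Lorentzian half-width (km/s) from natural/damping broadening
    """
    sigma = b / np.sqrt(2.0)
    z = (v + 1j * gamma) / (sigma * np.sqrt(2.0))
    # Real part of the Faddeeva function gives the Gaussian-Lorentzian convolution.
    return wofz(z).real / (sigma * np.sqrt(2.0 * np.pi))

v = np.linspace(-100.0, 100.0, 2001)
profile = voigt_profile(v, b=35.0, gamma=1.0)  # b near the z = 3 median above
```

An automated fitter decomposes an absorption spectrum into a sum of such profiles, each contributing a column density, a b-parameter, and a redshift; the b-parameter and column density distributions quoted in the abstract are statistics over those fitted components.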